Monotonicity of Entropy and Fisher Information: A Quick Proof via Maximal Correlation

Author

  • Thomas A. Courtade
Abstract

A simple proof is given for the monotonicity of entropy and Fisher information associated to sums of i.i.d. random variables. The proof relies on a characterization of maximal correlation for partial sums due to Dembo, Kagan and Shepp.
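For reference, the monotonicity statements at issue can be written in a standard form (this formulation is background and is not quoted from the paper): for i.i.d. random variables $X_1, X_2, \ldots$ with finite variance, and assuming the entropies and Fisher informations below exist,

\[
h\!\left(\frac{X_1+\cdots+X_{n+1}}{\sqrt{n+1}}\right) \;\ge\; h\!\left(\frac{X_1+\cdots+X_n}{\sqrt{n}}\right),
\qquad
J\!\left(\frac{X_1+\cdots+X_{n+1}}{\sqrt{n+1}}\right) \;\le\; J\!\left(\frac{X_1+\cdots+X_n}{\sqrt{n}}\right),
\]

where $h$ denotes differential entropy and $J$ Fisher information.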

Similar articles

Generalized Entropy Power Inequalities and Monotonicity Properties of Information (arXiv:cs/0605047v1 [cs.IT], 11 May 2006)

New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in ce...
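One representative special case of such subset-sum inequalities, stated here as background rather than quoted from the abstract, is the leave-one-out entropy power inequality: for independent random variables $X_1,\ldots,X_n$ with finite variance,

\[
e^{2h(X_1+\cdots+X_n)} \;\ge\; \frac{1}{n-1} \sum_{j=1}^{n} e^{2h\left(\sum_{i\ne j} X_i\right)}.
\]

Applied to i.i.d. summands and rescaled, this recovers the monotonicity of entropy along the central limit theorem.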


A Proof of the Fisher Information Inequality via a Data Processing Argument (IEEE Transactions on Information Theory)

The Fisher information J(X) of a random variable X under a translation parameter appears in information theory in the classical proof of the Entropy-Power Inequality (EPI). It enters the proof of the EPI via the de Bruijn identity, where it measures the variation of the differential entropy under a Gaussian perturbation, and via the convolution inequality $J(X+Y)^{-1} \ge J(X)^{-1} + J(Y)^{-1}$ (for indep...
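For context, the two ingredients named above admit the following standard formulations (background, not quoted from the paper): the de Bruijn identity

\[
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\, Z\right) \;=\; \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right),
\qquad Z \sim \mathcal{N}(0,1) \text{ independent of } X,
\]

and the Fisher information (convolution) inequality

\[
\frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)}
\quad \text{for independent } X, Y,
\]

with equality when $X$ and $Y$ are Gaussian.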


Sum Formula for Maximal Abstract Monotonicity and Abstract Rockafellar’s Surjectivity Theorem

In this paper, we present an example in which the sum of two maximal abstract monotone operators is maximal. We also show that the necessary condition for Rockafellar's surjectivity obtained in ([19], Theorem 4.3) can be sufficient.



Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam’s inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
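To spell out that consequence (a standard argument, not quoted from the abstract): writing $N(X) = e^{2h(X)}/(2\pi e)$ for the entropy power, the moment-entropy inequality gives $N(X) \le \operatorname{Var}(X)$, Stam's inequality gives $N(X)\, J(X) \ge 1$, and chaining the two yields the Cramér-Rao inequality

\[
\operatorname{Var}(X)\, J(X) \;\ge\; N(X)\, J(X) \;\ge\; 1,
\]

with equality throughout iff $X$ is Gaussian.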



Journal:
  • CoRR

Volume: abs/1610.04174   Issue:

Pages: -

Publication date: 2016